Automatic differentiation and the step computation in the limited memory BFGS method


Similar articles

Automatic Differentiation and the Step Computation in the Limited Memory BFGS Method

It is shown that the two-loop recursion for computing the search direction of a limited memory method for optimization can be derived by means of the reverse mode of automatic differentiation applied to an auxiliary function.
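
For readers who want to see what that recursion looks like concretely, below is a minimal NumPy sketch of the standard two-loop recursion in its usual direct form (not the reverse-mode automatic-differentiation construction the paper derives); the function and variable names are illustrative.

import numpy as np

def two_loop_recursion(grad, s_list, y_list):
    # Sketch of the standard L-BFGS two-loop recursion: returns r ~ H_k * grad,
    # where H_k is the implicit inverse-Hessian approximation built from the
    # stored pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (oldest first).
    q = np.asarray(grad, dtype=float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]

    # First loop: newest pair to oldest.
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y

    # Scale by the usual initial inverse-Hessian approximation gamma * I.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q

    # Second loop: oldest pair to newest (alphas were stored newest first).
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s

    return r  # the quasi-Newton search direction is -r

With an empty history the recursion reduces to the steepest-descent direction, which is how L-BFGS iterations are typically started.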



Large-scale Kalman Filtering Using the Limited Memory BFGS Method

The standard formulations of the Kalman filter (KF) and extended Kalman filter (EKF) require the storage and multiplication of matrices of size n × n, where n is the size of the state space, and the inversion of matrices of size m × m, where m is the size of the observation space. Thus when both m and n are large, implementation issues arise. In this paper, we advocate the use of the limited me...
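
The size argument is easier to see next to the textbook update itself; the following NumPy sketch of a standard Kalman filter measurement step (not the paper's limited-memory variant, and with illustrative names) shows where the n × n storage and the m × m inversion enter.

import numpy as np

def kalman_measurement_update(x, P, H, R, z):
    # Textbook KF measurement update: x is the length-n state, P the n x n
    # state covariance, H the m x n observation operator, R the m x m
    # observation noise covariance, and z the length-m observation.
    S = H @ P @ H.T + R              # m x m innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # n x m gain; requires the m x m inverse
    x_new = x + K @ (z - H @ x)      # updated state estimate
    P_new = P - K @ H @ P            # updated n x n covariance
    return x_new, P_new

Both the storage of P and the inversion of S scale badly when n and m are large, which is the implementation issue the abstract points to.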


On the limited memory BFGS method for large scale optimization

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better a...


A regularized limited-memory BFGS method for unconstrained minimization problems

The limited-memory BFGS (L-BFGS) algorithm is a popular method of solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
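
As a rough, assumption-level illustration of the idea (not the authors' exact algorithm), a regularized quasi-Newton step replaces the Wolfe line search with a shifted linear solve, where the parameter mu plays a role similar to a trust-region or Levenberg-Marquardt damping term:

import numpy as np

def regularized_quasi_newton_step(g, B, mu):
    # Generic regularized step: solve (B + mu * I) p = -g.  A larger mu gives
    # a shorter, more gradient-like step, which is the usual safeguard on
    # ill-conditioned problems; B stands in for a quasi-Newton Hessian model.
    n = g.size
    return np.linalg.solve(B + mu * np.eye(n), -g)

In an actual limited-memory implementation B would not be formed as a dense matrix; the shifted solve would instead use the compact L-BFGS representation of the Hessian approximation.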



Journal

Journal title: Applied Mathematics Letters

Year: 1993

ISSN: 0893-9659

DOI: 10.1016/0893-9659(93)90032-i